Bayesian ODE solvers: the maximum a posteriori estimate

Authors

Filip Tronarp, Simo Särkkä, Philipp Hennig

Abstract

There is a growing interest in probabilistic numerical solutions to ordinary differential equations. In this paper, the maximum a posteriori estimate is studied under the class of ν times differentiable linear time-invariant Gauss–Markov priors, which can be computed with an iterated extended Kalman smoother. The maximum a posteriori estimate corresponds to an optimal interpolant in the reproducing kernel Hilbert space associated with the prior, which in the present case is equivalent to a Sobolev space of smoothness ν + 1. Subject to mild conditions on the vector field, convergence rates are then obtained via methods from nonlinear analysis and scattered data approximation. These results closely resemble classical convergence statements in the sense that a ν times differentiable prior process obtains a global order of ν, which is demonstrated in numerical examples.
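
To make the construction concrete, the sketch below runs a small iterated extended Kalman smoother of the kind described above for a scalar ODE under a once-integrated Wiener process prior (ν = 1). The logistic test equation, the step size, the diffusion parameter, and the tiny noise term R attached to the zero "observations" of y′(tₙ) − f(y(tₙ)) are assumptions made for this example; the paper itself covers general ν and general linear time-invariant Gauss–Markov priors.

```python
import numpy as np

def f(y):  return y * (1.0 - y)          # assumed test ODE: logistic y' = y(1 - y)
def df(y): return 1.0 - 2.0 * y          # its Jacobian, needed for linearization

T, h, sigma2 = 10.0, 0.1, 1.0            # horizon, step size, prior diffusion
N = int(round(T / h)) + 1

# Once-integrated Wiener process prior (nu = 1): state x = [y, y'].
A = np.array([[1.0, h], [0.0, 1.0]])                           # transition matrix
Q = sigma2 * np.array([[h**3 / 3, h**2 / 2], [h**2 / 2, h]])   # process noise
R = 1e-10                                # tiny nugget on the zero "observation"

m0 = np.array([0.1, f(0.1)])             # known initial value and derivative
P0 = 1e-10 * np.eye(2)

lin = None                               # linearization points (smoothed means)
for _ in range(10):                      # first sweep = standard EKS, then relinearize
    means, covs, pmeans, pcovs = [m0], [P0], [m0], [P0]
    m, P = m0, P0
    for n in range(1, N):
        m, P = A @ m, A @ P @ A.T + Q    # predict with the Gauss-Markov prior
        pmeans.append(m); pcovs.append(P)
        xb = m if lin is None else lin[n]          # linearization point
        H = np.array([-df(xb[0]), 1.0])            # Jacobian of z = y' - f(y)
        z_hat = (xb[1] - f(xb[0])) + H @ (m - xb)  # linearized residual
        S = H @ P @ H + R
        K = P @ H / S
        m, P = m + K * (0.0 - z_hat), P - np.outer(K, K) * S
        means.append(m); covs.append(P)
    sm = [None] * N                      # Rauch-Tung-Striebel pass (means only)
    sm[-1] = means[-1]
    for n in range(N - 2, -1, -1):
        G = covs[n] @ A.T @ np.linalg.inv(pcovs[n + 1])
        sm[n] = means[n] + G @ (sm[n + 1] - pmeans[n + 1])
    lin = np.array(sm)                   # relinearize around the smoothed means

print("MAP estimate of y(T):", lin[-1, 0])
print("reference solution  :", 1.0 / (1.0 + 9.0 * np.exp(-T)))
```

Smoothed covariances can be propagated in the same backward pass if uncertainty quantification is wanted; only the smoothed means are needed for the relinearization step shown here.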

Related articles

Active Uncertainty Calibration in Bayesian ODE Solvers

There is resurging interest, in statistics and machine learning, in solvers for ordinary differential equations (ODEs) that return probability measures instead of point estimates. Recently, Conrad et al. introduced a sampling-based class of methods that are ‘well-calibrated’ in a specific sense. But the computational cost of these methods is significantly above that of classic methods. On the o...
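
For contrast with the smoothing-based MAP estimate above, here is a minimal sketch in the spirit of the sampling-based construction attributed to Conrad et al. (not their exact scheme): each explicit Euler step is perturbed by Gaussian noise whose standard deviation scales like h^{3/2}, and many sample paths are drawn. The test equation y′ = −y, the noise scale, and the number of paths are assumptions for illustration; drawing S paths also makes the cost roughly S times that of one deterministic solve, which is the computational issue the abstract points to.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(t, y):                       # assumed test problem: y' = -y, y(0) = 1
    return -y

def perturbed_euler(y0, t0, t1, h, scale, rng):
    """One sample path: Euler steps plus N(0, (scale * h**1.5)**2) perturbations."""
    n = int(round((t1 - t0) / h))
    t, y = t0, y0
    for _ in range(n):
        y = y + h * f(t, y) + scale * h**1.5 * rng.standard_normal()
        t += h
    return y

h, S = 0.01, 200                   # step size and number of sample paths
samples = np.array([perturbed_euler(1.0, 0.0, 1.0, h, 1.0, rng) for _ in range(S)])
print("sample mean %.4f, sample std %.4f, exact y(1) %.4f"
      % (samples.mean(), samples.std(), np.exp(-1.0)))
```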

Bayesian Maximum a Posteriori Multiple Testing Procedure

We consider a Bayesian approach to multiple hypothesis testing. A hierarchical prior model is based on imposing a prior distribution π(k) on the number of hypotheses arising from alternatives (false nulls). We then apply the maximum a posteriori (MAP) rule to find the most likely configuration of null and alternative hypotheses. The resulting MAP procedure and its closely related step-up and st...
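
As a toy illustration of a MAP rule over configurations (not the specific hierarchical model of the cited paper), the sketch below assumes N(0,1) nulls, N(θ,1) alternatives with θ known, and a geometric-type prior π(k) on the number of alternatives. For each k, the most likely configuration selects the k tests with the largest likelihood ratios, and the MAP procedure then maximises over k.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data (assumed model): m z-scores, N(0,1) under the null, N(theta,1) under
# the alternative with theta known; the first k_true tests are false nulls.
m, k_true, theta = 50, 10, 3.0
z = rng.standard_normal(m)
z[:k_true] += theta

def log_prior(k, q=0.3):           # assumed unnormalised geometric-type prior pi(k)
    return k * np.log(q)

# Per-test log likelihood ratio of alternative vs null: theta*z - theta^2/2.
llr = theta * z - 0.5 * theta**2
order = np.argsort(-llr)           # for fixed k, the most likely configuration
                                   # takes the k tests with the largest ratios
best_k, best_score = 0, log_prior(0)
for k in range(1, m + 1):
    score = llr[order[:k]].sum() + log_prior(k)
    if score > best_score:
        best_k, best_score = k, score

print("MAP number of alternatives:", best_k)
print("declared alternatives     :", np.sort(order[:best_k]))
```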

Neural Network Classifiers Estimate Bayesian a posteriori Probabilities

Many neural network classifiers provide outputs which estimate Bayesian a posteriori probabilities. When the estimation is accurate, network outputs can be treated as probabilities and sum to one. Simple proofs show that Bayesian probabilities are estimated when desired network outputs are 1 of M (one output unity, all others zero) and a squared-error or cross-entropy cost function is used. Resu...
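
A small numerical check of this claim on assumed synthetic data: a single sigmoid unit trained with cross-entropy on 0/1 (one-of-M with M = 2) targets recovers the true class posterior of a two-Gaussian mixture.

```python
import numpy as np

rng = np.random.default_rng(2)

# Assumed synthetic problem: x | class 0 ~ N(-1, 1), x | class 1 ~ N(+1, 1),
# equal priors, so the true posterior is P(class 1 | x) = sigmoid(2x).
n = 20000
labels = rng.integers(0, 2, n)
x = rng.normal(2.0 * labels - 1.0, 1.0)

def sigmoid(u):
    return 1.0 / (1.0 + np.exp(-u))

# Minimal "network": one sigmoid unit, trained by gradient descent on the
# mean cross-entropy with 0/1 targets.
w, b, lr = 0.0, 0.0, 0.1
for _ in range(2000):
    p = sigmoid(w * x + b)
    w -= lr * np.mean((p - labels) * x)
    b -= lr * np.mean(p - labels)

for xq in (-1.0, 0.0, 0.5, 1.5):
    print("x=%4.1f  network output %.3f  true posterior %.3f"
          % (xq, sigmoid(w * xq + b), sigmoid(2.0 * xq)))
```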

Disparity Estimation Based on Bayesian Maximum A Posteriori (MAP) Algorithm

In this paper, a general formula of disparity estimation based on Bayesian Maximum A Posteriori (MAP) algorithm is derived and implemented with simplified probabilistic models. The formula is the generalized probabilistic diffusion equation based on Bayesian model, and can be implemented into some different forms corresponding to the probabilistic models in the disparity neighborhood system or ...
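
The following is not the diffusion formulation of the cited paper but a generic scanline sketch of MAP disparity estimation under assumed models: a Gaussian matching likelihood supplies the data cost, an absolute-difference smoothness prior couples neighbouring pixels, and dynamic programming finds the exact MAP disparity path along the scanline.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy 1-D "stereo" scanline (assumed setup): the right signal is the left one
# shifted by a piecewise-constant disparity, plus noise.
W, D = 200, 8                          # scanline width, number of disparity levels
true_d = np.where(np.arange(W) < W // 2, 2, 5)
left = rng.normal(0.0, 1.0, W + D)
right = np.array([left[i + true_d[i]] for i in range(W)]) + 0.05 * rng.standard_normal(W)

# Negative log posterior = data cost (Gaussian likelihood -> squared difference)
# + lam * |d_i - d_{i-1}| (smoothness prior); minimised exactly by Viterbi-style DP.
cost = np.array([[(right[i] - left[i + d]) ** 2 for d in range(D)] for i in range(W)])
lam = 1.0

dp = cost[0].copy()
back = np.zeros((W, D), dtype=int)
for i in range(1, W):
    trans = dp[None, :] + lam * np.abs(np.arange(D)[:, None] - np.arange(D)[None, :])
    back[i] = trans.argmin(axis=1)     # best previous disparity for each current one
    dp = cost[i] + trans.min(axis=1)

d_map = np.zeros(W, dtype=int)         # backtrack the MAP disparity path
d_map[-1] = dp.argmin()
for i in range(W - 1, 0, -1):
    d_map[i - 1] = back[i, d_map[i]]

print("fraction of pixels assigned the true disparity:", np.mean(d_map == true_d))
```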

Simple ODE Solvers - Error Behaviour

y′(t) = f(t, y(t)), y(t0) = y0. Here f(t, y) is a given function, t0 is a given initial time and y0 is a given initial value for y. The unknown in the problem is the function y(t). Two obvious considerations in deciding whether or not a given algorithm is of any practical value are (a) the amount of computational effort required to execute the algorithm and (b) the accuracy that this computational effort yields. Fo...
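
A short sketch of point (a) versus point (b) on an assumed test problem y′ = −y: explicit Euler costs one evaluation of f per step and its global error shrinks linearly in h, while the improved Euler (Heun) method costs two evaluations per step and its error shrinks quadratically.

```python
import numpy as np

def f(t, y):                          # assumed test problem: y' = -y, y(0) = 1
    return -y

def euler(y0, t0, t1, h):             # explicit Euler: one f evaluation per step
    n = int(round((t1 - t0) / h))
    t, y = t0, y0
    for _ in range(n):
        y, t = y + h * f(t, y), t + h
    return y

def heun(y0, t0, t1, h):              # improved Euler: two f evaluations per step
    n = int(round((t1 - t0) / h))
    t, y = t0, y0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h, y + h * k1)
        y, t = y + 0.5 * h * (k1 + k2), t + h
    return y

exact = np.exp(-1.0)                  # true solution y(1) = e^{-1}
for h in (0.1, 0.05, 0.025, 0.0125):
    print("h=%7.4f  Euler error %.2e  improved Euler error %.2e"
          % (h, abs(euler(1.0, 0.0, 1.0, h) - exact), abs(heun(1.0, 0.0, 1.0, h) - exact)))
```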

Journal

Journal title: Statistics and Computing

Year: 2021

ISSN: 0960-3174, 1573-1375

DOI: https://doi.org/10.1007/s11222-021-09993-7